In mathematics, '''division by zero''' is division where the divisor (denominator) is zero. Such a division can be formally expressed as ''a''/0, where ''a'' is the dividend (numerator). In ordinary arithmetic the expression has no meaning, as there is no number which, when multiplied by 0, gives ''a'' (assuming ''a'' ≠ 0), and so division by zero is undefined. Since any number multiplied by zero is zero, the expression 0/0 also has no defined value; when it occurs as the form of a limit, it is an indeterminate form. Historically, one of the earliest recorded references to the mathematical impossibility of assigning a value to ''a''/0 is contained in George Berkeley's criticism of infinitesimal calculus in ''The Analyst'' ("ghosts of departed quantities").

There are mathematical structures in which ''a''/0 is defined for some ''a'', such as the Riemann sphere and the real projective line; however, such structures cannot satisfy every ordinary rule of arithmetic (the field axioms).

In computing, a program error may result from an attempt to divide by zero. Depending on the programming environment and the type of number (e.g. floating point, integer) being divided by zero, the operation may produce positive or negative infinity under the IEEE 754 floating-point standard, generate an exception, generate an error message, cause the program to terminate, or result in a special not-a-number value.

==Elementary arithmetic==

When division is explained at the elementary arithmetic level, it is often considered as splitting a set of objects into equal parts. As an example, consider ten cookies that are to be distributed equally to five people at a table. Each person would receive 10/5 = 2 cookies. Similarly, if there are ten cookies and only one person at the table, that person would receive 10/1 = 10 cookies.

So, for dividing by zero: what is the number of cookies that each person receives when 10 cookies are evenly distributed amongst 0 people at a table? Certain words can be pinpointed in the question to highlight the problem. The problem here is with the "when": there is no way to evenly distribute 10 cookies to nobody. In mathematical jargon, a set of 10 items cannot be partitioned into 0 subsets. So 10/0, at least in elementary arithmetic, is said to be either meaningless or undefined.

Similar problems occur if one has 0 cookies and 0 people, but this time the problem is in the phrase "the number". A partition is possible (of a set with 0 elements into 0 parts), but since the partition has 0 parts, vacuously every set in the partition has any given number of elements, be it 0, 2, 5, or 1000.

If there are, say, 5 cookies and 2 people, the problem is in "evenly distribute": in any integer partition of a 5-set into 2 parts, one of the parts will have more elements than the other. The problem with 5 cookies and 2 people can, however, be solved by cutting one cookie in half, whereas the problem with 5 cookies and 0 people cannot be solved in any way that preserves the meaning of "divide".

Another way of looking at division by zero is that division can always be checked using multiplication. Considering the 10/0 example above, if ''x'' = 10/0, then ''x'' × 0 would have to equal 10; but there is no ''x'' that, when multiplied by zero, gives ten (or any number other than zero). If instead we set ''x'' = 0/0, then every ''x'' answers the question "what number, multiplied by zero, gives zero?", so no single value can be singled out as the quotient.
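This check can be written out symbolically, interpreting ''a''/''b'' = ''x'' as shorthand for the statement ''x'' × ''b'' = ''a'' over the real numbers:

<math>\frac{10}{0} = x \iff x \times 0 = 10,</math>

which no real number ''x'' satisfies, while

<math>\frac{0}{0} = x \iff x \times 0 = 0,</math>

which every real number ''x'' satisfies. In the first case no candidate quotient exists; in the second, no unique one does.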
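As a brief illustration of the computing behaviour described in the lead, the following is a minimal sketch, assuming Python with the NumPy library (whose float64 type follows IEEE 754 arithmetic). It contrasts integer division by zero, which raises an exception, with floating-point division by zero, which yields signed infinities and a not-a-number value:

<syntaxhighlight lang="python">
import numpy as np

# Integer division by zero: Python raises an exception instead of
# returning a value.
try:
    10 // 0
except ZeroDivisionError as exc:
    print("10 // 0 raises:", exc)

# Floating-point division by zero under IEEE 754 (NumPy float64):
# a nonzero dividend yields a signed infinity, and 0.0/0.0 yields NaN.
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.float64(10.0) / np.float64(0.0))    # inf
    print(np.float64(-10.0) / np.float64(0.0))   # -inf
    print(np.float64(0.0) / np.float64(0.0))     # nan
</syntaxhighlight>

Other environments behave differently, for example by signalling a hardware exception or terminating the program, which is why the outcome listed in the lead depends on both the number type and the programming environment.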